

Journal of KIISE (정보과학회논문지)


Title: Elastic Multiple Parametric Exponential Linear Units for Convolutional Neural Networks
Authors: Daeho Kim (김대호), Jaeil Kim (김재일)
Citation: Vol. 46, No. 5, pp. 469-477 (May 2019)
Abstract (translated from Korean):
The activation function is a key factor in determining the non-linearity and depth of a neural network model. Since Rectified Linear Units (ReLU) was proposed, activation functions of various forms have been introduced, such as Exponential Linear Units (ELU), which speeds up learning by pushing the mean activation close to zero, and Elastic Rectified Linear Units (EReLU), which improves performance by varying the slope of the function. We propose Elastic Multiple Parametric Exponential Linear Units (EMPELU), an activation function that generalizes the distinct ELU and EReLU. EMPELU randomly varies the slope within a given range in the positive region, while in the negative region it forms activation functions of various shapes through learnable parameters. On CIFAR-10/100 image classification with convolutional models, EMPELU showed improved accuracy and generalization over existing activation functions.
Abstract (English):
The activation function plays a major role in determining the depth and non-linearity of a neural network. Since the introduction of Rectified Linear Units (ReLU) for deep neural networks, many variants have been proposed. For example, Exponential Linear Units (ELU) leads to faster learning by pushing the mean of the activations closer to zero, and Elastic Rectified Linear Units (EReLU) changes the slope randomly for better model generalization. In this paper, we propose Elastic Multiple Parametric Exponential Linear Units (EMPELU) as a generalized form of ELU and EReLU. During training, EMPELU changes the slope of the positive part of the function randomly within a moderate range, while the negative part can take on various activation-function shapes through parameter learning. EMPELU improved the accuracy and generalization performance of convolutional neural networks on the object classification task (CIFAR-10/100) over well-known activation functions.
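The abstract's description suggests a minimal sketch of EMPELU: a randomly perturbed linear slope on the positive side (as in EReLU) and a learnable exponential curve on the negative side (as in ELU). The perturbation range `sigma` and the exact negative branch `alpha * (exp(beta * x) - 1)` with learnable `alpha`, `beta` are assumptions inferred from the abstract, not the paper's exact formulation.

```python
import numpy as np

def empelu(x, alpha=1.0, beta=1.0, sigma=0.1, training=True, rng=None):
    """Sketch of EMPELU, assumed from the abstract's description.

    Positive region: the slope is perturbed randomly within a moderate
    range [1 - sigma, 1 + sigma] during training (EReLU-style); it is
    fixed to 1 at test time.
    Negative region: a parametric exponential alpha * (exp(beta*x) - 1);
    alpha and beta would be learned per channel in a real network.
    """
    x = np.asarray(x, dtype=float)
    if training:
        # Draw an elementwise random slope factor for the positive part.
        rng = rng or np.random.default_rng()
        k = rng.uniform(1.0 - sigma, 1.0 + sigma, size=x.shape)
    else:
        k = 1.0
    # expm1 computes exp(beta*x) - 1 with good precision near zero.
    return np.where(x >= 0, k * x, alpha * np.expm1(beta * x))
```

At test time the positive branch reduces to the identity, and the negative branch saturates toward `-alpha` for large negative inputs, matching the ELU-like behavior the abstract describes.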
Keywords: Elastic Multiple Parametric Exponential Linear Units, activation function, convolutional neural network, image classification, deep learning